Exact upper and lower bounds on the misclassification probability

Author

  • Iosif Pinelis
Abstract

A lower bound on the misclassification probability for a finite number of classes is obtained in terms of the total variation norms of the differences between the sub-distributions over the classes. This bound, which is shown to be exact in a certain rather strong sense, is based on an exact upper bound on the difference between the maximum and the mean of a finite set of real numbers, in terms of the sum of the absolute values of the pairwise differences between the numbers.
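As a hedged numerical illustration of the quantities the abstract relates (not the paper's exact multi-class bounds or constants), the sketch below checks two elementary facts: the difference between the maximum and the mean of n reals is at most 1/n times the sum of the pairwise absolute differences, and for two classes the Bayes misclassification probability is determined exactly by the L1 (total-variation type) norm of the difference of the sub-distributions prior_i * f_i. All inputs are hypothetical toy values.

```python
# Hedged numerical sketch, not the paper's exact bounds or constants.
# (1) For reals x_1..x_n:  max_i x_i - mean(x) <= (1/n) * sum_{i<j} |x_i - x_j|.
# (2) For two classes with sub-distributions g_i = prior_i * f_i on a finite set,
#     the Bayes misclassification probability equals (1 - sum_x |g_1(x) - g_2(x)|) / 2.
from itertools import combinations

def max_minus_mean_bound(x):
    n = len(x)
    lhs = max(x) - sum(x) / n
    rhs = sum(abs(a - b) for a, b in combinations(x, 2)) / n
    return lhs, rhs

def two_class_bayes_error(g1, g2):
    # g1, g2: sub-distributions (prior times class-conditional pmf) on a common finite support
    bayes = sum(min(a, b) for a, b in zip(g1, g2))            # error of the Bayes classifier
    via_l1 = (1.0 - sum(abs(a - b) for a, b in zip(g1, g2))) / 2.0
    return bayes, via_l1

if __name__ == "__main__":
    lhs, rhs = max_minus_mean_bound([3.0, -1.0, 4.0, 1.5])
    print(f"max - mean = {lhs:.4f} <= {rhs:.4f} = (1/n) * sum of pairwise absolute differences")

    # Two classes with priors 0.6 / 0.4 and class-conditional pmfs on {0, 1, 2}
    f1, f2 = [0.5, 0.3, 0.2], [0.1, 0.3, 0.6]
    g1 = [0.6 * p for p in f1]
    g2 = [0.4 * p for p in f2]
    print("Bayes error and (1 - ||g1 - g2||_1)/2:", two_class_bayes_error(g1, g2))
```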


Related articles

Exact maximum coverage probabilities of confidence intervals with increasing bounds for Poisson distribution mean

A Poisson distribution is widely used as a standard model for analyzing count data, so estimation of the Poisson parameter is common in practice. Providing accurate confidence intervals for the parameters of discrete distributions is difficult, and many asymptotic confidence intervals for the mean of a Poisson distribution have been proposed so far. It is known that the coverag...
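To make the notion of an exact coverage probability concrete (using the familiar asymptotic Wald interval as a hypothetical stand-in, not necessarily the intervals studied in that paper), the sketch below sums the Poisson pmf over the counts whose interval contains the true mean.

```python
# Hedged illustration: exact coverage probability of a confidence interval for the
# Poisson mean, computed by summing the pmf over counts whose interval covers the mean.
# The Wald interval k +/- z*sqrt(k) is used purely as a familiar asymptotic example.
import math
from scipy.stats import poisson

def wald_interval(k, z=1.96):
    half = z * math.sqrt(k)
    return max(0.0, k - half), k + half

def exact_coverage(lam, z=1.96, k_max=None):
    # Truncate the sum where the Poisson tail is negligible.
    if k_max is None:
        k_max = int(lam + 12 * math.sqrt(lam) + 20)
    cov = 0.0
    for k in range(k_max + 1):
        lo, hi = wald_interval(k, z)
        if lo <= lam <= hi:
            cov += poisson.pmf(k, lam)
    return cov

if __name__ == "__main__":
    for lam in (1.0, 5.0, 20.0):
        print(f"lambda = {lam:5.1f}  exact coverage of the 95% Wald interval = {exact_coverage(lam):.4f}")
```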


A Sharp Sufficient Condition for Sparsity Pattern Recovery

The number of noisy linear measurements sufficient for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper the sufficient con...


Upper and lower bounds of symmetric division deg index

The Symmetric Division Deg index is one of the 148 discrete Adriatic indices that showed good predictive properties on the testing sets provided by the International Academy of Mathematical Chemistry. The Symmetric Division Deg index is defined by $$ SDD(G) = \sum_{uv \in E} \left( \frac{\min\{d_u, d_v\}}{\max\{d_u, d_v\}} + \frac{\max\{d_u, d_v\}}{\min\{d_u, d_v\}} \right), $$ where $d_i$ is the degree of vertex $i$ in the graph $G$. In th...
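A minimal sketch of this definition, evaluated for a small hypothetical graph directly from its edge list:

```python
# Symmetric Division Deg index computed from the definition above:
# a sum over edges uv of min(d_u,d_v)/max(d_u,d_v) + max(d_u,d_v)/min(d_u,d_v).
from collections import Counter

def sdd_index(edges):
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = 0.0
    for u, v in edges:
        lo, hi = sorted((deg[u], deg[v]))
        total += lo / hi + hi / lo
    return total

if __name__ == "__main__":
    # Path graph P4 with degrees 1, 2, 2, 1
    print(sdd_index([(1, 2), (2, 3), (3, 4)]))  # (1/2 + 2) + (1 + 1) + (2 + 1/2) = 7.0
```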


Analytical Bounds between Entropy and Error Probability in Binary Classifications

The existing upper and lower bounds between entropy and error probability are mostly derived from inequalities on the entropy relations, which can introduce approximations into the analysis. We derive analytical bounds based on closed-form solutions of the conditional entropy without involving any approximation. Two basic types of classification errors are investigated in the context of bin...


Lower and Upper Bounds for Misclassification Probability Based on Renyi's Information

Fano's inequality has proven to be an important result in Shannon's information theory, having found applications in innumerable proofs of convergence. It also provides a lower bound on the symbol error probability in a communication channel, in terms of Shannon's definitions of entropy and mutual information. This result is also significant in that it suggests insights into how the classi...
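As background (a hedged sketch of the classical Shannon-entropy version only, not the Rényi-based bounds of that paper), Fano's inequality h(P_e) + P_e * log2(M - 1) >= H(X|Y) for an M-symbol source can be inverted numerically to yield a lower bound on the error probability P_e; the toy joint pmf below is hypothetical.

```python
# Hedged sketch of the classical Shannon-entropy Fano lower bound on error probability.
import math
import numpy as np

def binary_entropy(p):
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conditional_entropy(joint):
    # H(X|Y) in bits for joint[x, y] = p(x, y)
    py = joint.sum(axis=0)
    h = 0.0
    for x in range(joint.shape[0]):
        for y in range(joint.shape[1]):
            if joint[x, y] > 0:
                h -= joint[x, y] * math.log2(joint[x, y] / py[y])
    return h

def fano_lower_bound(joint):
    # Smallest p with h(p) + p*log2(M-1) >= H(X|Y); the left side is increasing on [0, (M-1)/M].
    m = joint.shape[0]
    target = conditional_entropy(joint)
    lo, hi = 0.0, (m - 1) / m
    for _ in range(60):
        mid = (lo + hi) / 2
        if binary_entropy(mid) + mid * math.log2(m - 1) >= target:
            hi = mid
        else:
            lo = mid
    return hi

if __name__ == "__main__":
    # Three symbols observed through a noisy channel; rows index X, columns index Y.
    joint = np.array([[0.25, 0.04, 0.04],
                      [0.04, 0.25, 0.04],
                      [0.04, 0.04, 0.26]])
    print("H(X|Y) =", round(conditional_entropy(joint), 4), "bits")
    print("Fano lower bound on P_e =", round(fano_lower_bound(joint), 4))
```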




Publication year: 2017